Risk

Risk is the potential that a chosen action or activity (including the choice of inaction) will lead to a loss (an undesirable outcome). The notion implies that a choice having an influence on the outcome exists (or existed). Potential losses themselves may also be called "risks". Almost any human endeavor carries some risk, but some activities are much riskier than others.

Historical background

The Oxford English Dictionary cites the earliest use of the word in English (in the spelling of risque) as from 1621, and the spelling as risk from 1655. It defines risk as:

(Exposure to) the possibility of loss, injury, or other adverse or unwelcome circumstance; a chance or situation involving such a possibility.[1]

For the sociologist Niklas Luhmann the term 'risk' is a neologism that appeared with the transition from traditional to modern society.[2] "In the Middle Ages the term risicum was used in highly specific contexts, above all sea trade and its ensuing legal problems of loss and damage."[2][3] In the vernacular languages of the 16th century the words rischio and riezgo were used.[2] These were introduced to continental Europe through interaction with Middle Eastern and North African Arab traders. In the English language the term risk appeared only in the 17th century, and "seems to be imported from continental Europe."[2] When the terminology of risk took hold, it replaced the older notion that thought "in terms of good and bad fortune."[2] Niklas Luhmann (1996) seeks to explain this transition: "Perhaps, this was simply a loss of plausibility of the old rhetorics of Fortuna as an allegorical figure of religious content and of prudentia as a (noble) virtue in the emerging commercial society."[4]

Scenario analysis matured during Cold War confrontations between major powers, notably the United States and the Soviet Union. It became widespread in insurance circles in the 1970s when major oil tanker disasters forced a more comprehensive foresight. The scientific approach to risk entered finance in the 1960s with the advent of the capital asset pricing model and became increasingly important in the 1980s when financial derivatives proliferated. It reached general professions in the 1990s when the power of personal computing allowed for widespread data collection and numbers crunching.

Governments use such analysis, for example, to set standards for environmental regulation, e.g. the "pathway analysis" practiced by the United States Environmental Protection Agency.

Definitions of risk

ISO 31000:2009 Risk Management Standard

The ISO 31000 (2009)/ISO Guide 73 definition of risk is the 'effect of uncertainty on objectives'. In this definition, uncertainties include events (which may or may not happen) and uncertainties caused by a lack of information or ambiguity. The definition covers both negative and positive impacts on objectives. Many definitions of risk exist in common usage; this one, however, was developed by an international committee representing over 30 countries and is based on the input of several thousand subject matter experts.

Other definitions of risk

The many inconsistent and ambiguous meanings attached to "risk" lead to widespread confusion and also mean that very different approaches to risk management are taken in different fields.[5] For example:

Risk is the unwanted subset of a set of uncertain outcomes.[6]

Risk can be seen as relating to the probability of uncertain future events.[7] For example, according to Factor Analysis of Information Risk, risk is "the probable frequency and probable magnitude of future loss".[7] In computer science this definition is used by The Open Group.[8]
OHSAS (Occupational Health & Safety Advisory Services) defines risk as the product of the probability of a hazard resulting in an adverse event, times the severity of the event.[9]
In information security, risk is defined as "the potential that a given threat will exploit vulnerabilities of an asset or group of assets and thereby cause harm to the organization".[10]
Financial risk is often defined as the unexpected variability or volatility of returns and thus includes both potential worse-than-expected as well as better-than-expected returns. References to negative risk below should be read as applying to positive impacts or opportunity (e.g., for "loss" read "loss or gain") unless the context precludes this interpretation.
The related terms "threat" and "hazard" are often used to mean something that could cause harm.

Practice Areas

Risk is ubiquitous in all areas of life, and risk management is something that we all must do, whether we are managing a major organization or simply crossing the road. When describing risk, however, it is convenient to consider that risk practitioners operate in specific practice areas.

Economic risk

Economic risks can be manifested in lower incomes or higher expenditures than expected. The causes can be many, for instance, the hike in the price for raw materials, the lapsing of deadlines for construction of a new operating facility, disruptions in a production process, emergence of a serious competitor on the market, the loss of key personnel, the change of a political regime, or natural disasters.[11] Reference class forecasting was developed to eliminate or reduce economic risk.[12]

Health

Risks in personal health may be reduced by primary prevention actions that decrease early causes of illness or by secondary prevention actions after a person has clearly measured clinical signs or symptoms recognized as risk factors. Tertiary prevention reduces the negative impact of an already established disease by restoring function and reducing disease-related complications. Ethical medical practice requires careful discussion of risk factors with individual patients to obtain informed consent for secondary and tertiary prevention efforts, whereas public health efforts in primary prevention require education of the entire population at risk. In each case, careful communication about risk factors, likely outcomes and certainty must distinguish between causal events that must be decreased and associated events that may be merely consequences rather than causes.

In epidemiology, the lifetime risk of an effect is the cumulative incidence, also called the incidence proportion, over an entire lifetime.[13]
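
Expressed as a formula (a standard epidemiological definition, not one taken from the cited glossary), the incidence proportion over a given period is:

 \text{incidence proportion} = \frac{\text{number of new cases during the period}}{\text{number of persons at risk at the start of the period}}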

Health, Safety and Environment

Health, Safety and Environment (HSE) are separate practice areas, but they are often linked, typically for reasons of organizational management structure; there are also strong links between the disciplines themselves. One of the strongest is that a single risk event may have impacts in all three areas, albeit over differing timescales. For example, the uncontrolled release of radiation or of a toxic chemical may have immediate short-term safety consequences, more protracted health impacts, and much longer-term environmental impacts. Chernobyl, for example, caused immediate deaths, longer-term deaths from cancers, and a lasting environmental impact leading to birth defects, impacts on wildlife, etc.

Information Technology and Information Security

Information technology risk, IT risk, or IT-related risk is any risk related to information technology. The term is relatively new and reflects an increasing awareness that information security is only one facet of a multitude of risks that are relevant to IT and to the real-world processes it supports.

The increasing dependence of modern society on information and computer networks, in both the private and public sectors, including the military,[14][15][16] has led to new terms such as IT risk and cyberwarfare.

Information security means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, perusal, inspection, recording or destruction.[17] Information security grew out of practices and procedures of computer security.
Information security has grown into information assurance (IA), the practice of managing risks related to the use, processing, storage, and transmission of information or data and to the systems and processes used for those purposes.
While focused dominantly on information in digital form, the full range of IA encompasses not only digital but also analog or physical form.
Information assurance is interdisciplinary and draws from multiple fields, including accounting, fraud examination, forensic science, management science, systems engineering, security engineering, and criminology, in addition to computer science.

IT risk, then, is narrowly focused on computer security, while information security extends to risks related to other forms of information (paper, microfilm). Information assurance risks include those related to the consistency of the business information stored in IT systems with the information stored by other means, and the relevant business consequences.

Insurance

Insurance is a risk treatment option which involves risk sharing. It can be considered a form of contingent capital and is akin to purchasing an option, in which the buyer pays a small premium to be protected from a potentially large loss.

Business and Management

Means of assessing risk vary widely between professions. Indeed, they may define these professions; for example, a doctor manages medical risk, while a civil engineer manages risk of structural failure. A professional code of ethics is usually focused on risk assessment and mitigation (by the professional on behalf of client, public, society or life in general).

In the workplace, incidental and inherent risks exist. Incidental risks are those that occur naturally in the business but are not part of the core of the business. Inherent risks have a negative effect on the operating profit of the business.

High Reliability Organizations (HROs)

A high reliability organization (HRO) is an organization that has succeeded in avoiding catastrophes in an environment where normal accidents can be expected due to risk factors and complexity. Most studies of HROs involve areas such as nuclear aircraft carriers, air traffic control, aerospace and nuclear power stations. Organizations such as these share the ability to consistently operate safely in complex, interconnected environments where a single failure in one component could lead to catastrophe. Essentially, they are organizations which appear to operate 'in spite of' an enormous range of risks.

Some of these industries manage risk in a highly quantified and enumerated way. These include the nuclear power and aircraft industries, where the possible failure of a complex series of engineered systems could result in highly undesirable outcomes. The usual measure of risk for a class of events is then R = (probability of the event) × (consequence of the event).

The total risk is then the sum of the individual class-risks.

In the nuclear industry, consequence is often measured in terms of off-site radiological release, and this is often banded into five or six bands, each spanning a decade (a factor of ten).

The risks are evaluated using fault tree/event tree techniques (see safety engineering). Where these risks are low, they are normally considered to be "Broadly Acceptable". A higher level of risk (typically up to 10 to 100 times what is considered Broadly Acceptable) has to be justified against the costs of reducing it further and the possible benefits that make it tolerable—these risks are described as "Tolerable if ALARP". Risks beyond this level are classified as "Intolerable".
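
As an illustration only, the per-class calculation and the banding logic described above can be sketched in Python; the threshold values and event classes below are hypothetical placeholders, not figures from any regulator.

# Hypothetical sketch of per-class risk and ALARP-style banding.
# Threshold values are illustrative placeholders, not regulatory figures.

BROADLY_ACCEPTABLE = 1e-3   # assumed risk threshold (hypothetical units)
INTOLERABLE = 1e-1          # assumed upper threshold (hypothetical units)

def class_risk(probability, consequence):
    """Risk for one class of events: R = probability x consequence."""
    return probability * consequence

def band(risk):
    """Band a risk value in the spirit of the ALARP framework."""
    if risk <= BROADLY_ACCEPTABLE:
        return "Broadly Acceptable"
    if risk <= INTOLERABLE:
        return "Tolerable if ALARP"
    return "Intolerable"

# Three assumed event classes: (annual probability, consequence measure).
classes = [(1e-6, 100.0), (1e-4, 50.0), (1e-2, 20.0)]
for p, c in classes:
    print(p, c, band(class_risk(p, c)))
print("total risk:", sum(class_risk(p, c) for p, c in classes))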

The level of risk deemed Broadly Acceptable has been considered by regulatory bodies in various countries—an early attempt by UK government regulator and academic F. R. Farmer used the example of hill-walking and similar activities, which have definable risks that people appear to find acceptable. This resulted in the so-called Farmer Curve of acceptable probability of an event versus its consequence.

The technique as a whole is usually referred to as Probabilistic Risk Assessment (PRA) (or Probabilistic Safety Assessment, PSA). See WASH-1400 for an example of this approach.

Finance

In finance, risk is the probability that an investment's actual return will be different than expected. This includes the possibility of losing some or all of the original investment. In a view advocated by Damodaran, risk includes not only "downside risk" but also "upside risk" (returns that exceed expectations).[18] Some regard a calculation of the standard deviation of the historical returns or average returns of a specific investment as providing some historical measure of risk; see modern portfolio theory. Financial risk may be market-dependent, determined by numerous market factors, or operational, resulting from fraudulent behavior (e.g. Bernard Madoff). Recent studies suggest that testosterone level plays a major role in risk taking during financial decisions.[19][20]
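
As a minimal illustration of the standard-deviation measure mentioned above, using an invented return series:

import statistics

# Purely illustrative monthly returns (as fractions); not real market data.
returns = [0.02, -0.01, 0.03, 0.005, -0.02, 0.01]

mean_return = statistics.mean(returns)
volatility = statistics.stdev(returns)  # sample standard deviation of returns

print(f"mean return: {mean_return:.4f}, volatility: {volatility:.4f}")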

In finance, risk has no one definition, but some theorists, notably Ron Dembo, have defined quite general methods to assess risk as an expected after-the-fact level of regret. Such methods have been uniquely successful in limiting interest rate risk in financial markets. Financial markets are considered to be a proving ground for general methods of risk assessment. However, these methods are also hard to understand. The mathematical difficulties interfere with other social goods such as disclosure, valuation and transparency. In particular, it is not always obvious if such financial instruments are "hedging" (purchasing/selling a financial instrument specifically to reduce or cancel out the risk in another investment) or "speculation" (increasing measurable risk and exposing the investor to catastrophic loss in pursuit of very high windfalls that increase expected value).

As regret measures rarely reflect actual human risk-aversion, it is difficult to determine if the outcomes of such transactions will be satisfactory. Risk seeking describes an individual whose utility function's second derivative is positive. Such an individual would willingly (actually pay a premium to) assume all risk in the economy and is hence not likely to exist.

In financial markets, one may need to measure credit risk, information timing and source risk, probability model risk, and legal risk if there are regulatory or civil actions taken as a result of some "investor's regret". Knowing one's risk appetite in conjunction with one's financial well-being is crucial.

A fundamental idea in finance is the relationship between risk and return (see modern portfolio theory). The greater the potential return one might seek, the greater the risk that one generally assumes. A free market reflects this principle in the pricing of an instrument: strong demand for a safer instrument drives its price higher (and its return proportionately lower), while weak demand for a riskier instrument drives its price lower (and its potential return thereby higher).

"For example, a US Treasury bond is considered to be one of the safest investments and, when compared to a corporate bond, provides a lower rate of return. The reason for this is that a corporation is much more likely to go bankrupt than the U.S. government. Because the risk of investing in a corporate bond is higher, investors are offered a higher rate of return."

The most popular, and lately also the most vilified, risk measure is value at risk (VaR). There are different types of VaR: long-term VaR, marginal VaR, factor VaR and shock VaR.[21] The latter is used to measure risk under extreme market stress conditions.
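
As a hedged sketch, a basic historical-simulation VaR (a common estimation approach, distinct from the named variants above) can be computed from a sample of past returns, here invented for the example:

# Minimal historical-simulation VaR sketch; the return data are invented.
# VaR at confidence level c is the loss that is not exceeded with probability c.

def historical_var(returns, confidence=0.95):
    """Return the historical VaR as a positive loss fraction."""
    losses = sorted(-r for r in returns)      # convert returns to losses, ascending
    index = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[index]

returns = [0.01, -0.02, 0.015, -0.03, 0.005, -0.01, 0.02, -0.005]
print("95% one-period VaR:", historical_var(returns))   # 0.03 for this sample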

Security

Security risk management involves protection of assets from harm caused by deliberate acts. A more detailed definition is: "A security risk is any event that could result in the compromise of organizational assets. The unauthorized use, loss, damage, disclosure or modification of organizational assets for the profit, personal interest or political interests of individuals, groups or other entities constitutes a compromise of the asset, and includes the risk of harm to people. Compromise of organizational assets may adversely affect the enterprise, its business units and their clients. As such, consideration of security risk is a vital component of risk management."[22]

Societal Risk

In a peer reviewed study of risk in public works projects located in twenty nations on five continents, Flyvbjerg, Holm, and Buhl (2002, 2005) documented high risks for such ventures for both costs[23] and demand.[24] Actual costs of projects were typically higher than estimated costs; cost overruns of 50% were common, overruns above 100% not uncommon. Actual demand was often lower than estimated; demand shortfalls of 25% were common, of 50% not uncommon.

Due to such cost and demand risks, cost-benefit analyses of public works projects have proved to be highly uncertain.

The main causes of cost and demand risks were found to be optimism bias and strategic misrepresentation. Measures identified to mitigate this type of risk are better governance through incentive alignment and the use of reference class forecasting.[25]

Human Factors

One of the growing areas of focus in risk management is the field of human factors, where behavioral and organizational psychology underpin our understanding of risk-based decision making. This field considers questions such as "How do we make risk-based decisions?" and "Why are we irrationally more scared of sharks and terrorists than of motor vehicles and medications?"

In decision theory, regret (and anticipation of regret) can play a significant part in decision-making, distinct from risk aversion (preferring the status quo in case one becomes worse off).

Framing[26] is a fundamental problem with all forms of risk assessment. In particular, because of bounded rationality (our brains get overloaded, so we take mental shortcuts), the risk of extreme events is discounted because the probability is too low to evaluate intuitively. As an example, one of the leading causes of death is road accidents caused by drunk driving—partly because any given driver frames the problem by largely or totally ignoring the risk of a serious or fatal accident.

For instance, an extremely disturbing event (an attack by hijacking, or moral hazards) may be ignored in analysis despite the fact it has occurred and has a nonzero probability. Or, an event that everyone agrees is inevitable may be ruled out of analysis due to greed or an unwillingness to admit that it is believed to be inevitable. These human tendencies for error and wishful thinking often affect even the most rigorous applications of the scientific method and are a major concern of the philosophy of science.

All decision-making under uncertainty must consider cognitive bias, cultural bias, and notational bias: no group of people assessing risk is immune to "groupthink", the acceptance of obviously wrong answers simply because it is socially painful to disagree, particularly where there are conflicts of interest. One effective way to solve framing problems in risk assessment or measurement (although some argue that risk cannot be measured, only assessed) is to raise others' fears or personal ideals by way of completeness.

Framing involves other information that affects the outcome of a risky decision. The right prefrontal cortex has been shown to take a more global perspective,[27] while greater left prefrontal activity relates to local or focal processing.[28]

Based on the theory of leaky modules,[29] McElroy and Seta proposed that they could predictably alter the framing effect by selectively manipulating regional prefrontal activity with finger tapping or monaural listening.[30] The result was as expected: rightward tapping or listening narrowed attention such that the frame was ignored. This is a practical way of manipulating regional cortical activation to affect risky decisions, especially because directed tapping or listening is easily done.

Risk assessment and analysis

Because planned actions are subject to large cost and benefit risks, proper risk assessment and risk management for such actions are crucial to making them successful.[31]

Since risk assessment and risk management are essential in security management, the two are tightly related. Security assessment methodologies like CRAMM contain risk assessment modules as an important part of the first steps of the methodology. On the other hand, risk assessment methodologies like Mehari have evolved into security assessment methodologies. An ISO standard on risk management (Principles and guidelines on implementation) was published under code ISO 31000 on 13 November 2009.

Quantitative Analysis

As risk carries so many different meanings, there are many formal methods used to assess or to "measure" risk. Some of the quantitative definitions of risk are well grounded in statistical theory and lead naturally to statistical estimates, but some are more subjective; for example, in many cases a critical factor is human decision making.

Even when statistical estimates are available, in many cases risk is associated with rare failures of some kind, and data may be sparse. Often, the probability of a negative event is estimated by using the frequency of past similar events or by event tree methods, but probabilities for rare failures may be difficult to estimate if an event tree cannot be formulated. This makes risk assessment difficult in hazardous industries, for example nuclear energy, where the frequency of failures is rare and harmful consequences of failure are numerous and severe.

Statistical methods may also require the use of a cost function, which in turn may require the calculation of the cost of loss of a human life. This is a difficult problem. One approach is to ask what people are willing to pay to insure against death[32] or against a radiological release (e.g. GBq of radio-iodine), but as the answers depend very strongly on the circumstances it is not clear that this approach is effective.

In statistics, the notion of risk is often modelled as the expected value of an undesirable outcome. This combines the probabilities of various possible events and some assessment of the corresponding harm into a single value. See also Expected utility. The simplest case is a binary possibility of Accident or No accident. The associated formula for calculating risk is then:

 \text{Risk} = (\text{probability of the accident occurring}) \times  (\text{expected loss in case of the accident})

For example, if performing activity X has a probability of 0.01 of suffering an accident of A, with a loss of 1000, then total risk is a loss of 10, the product of 0.01 and 1000.

Situations are sometimes more complex than the simple binary possibility case. In a situation with several possible accidents, total risk is the sum of the risks for each different accident, provided that the outcomes are comparable:

 \text{Risk} = \sum_{\text{all accidents}} (\text{probability of the accident occurring}) \times (\text{expected loss in case of the accident})

For example, if performing activity X has a probability of 0.01 of suffering an accident of A, with a loss of 1000, and a probability of 0.000001 of suffering an accident of type B, with a loss of 2,000,000, then total risk is a loss of 12, which is equal to a loss of 10 from an accident of type A and 2 from an accident of type B.
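
The worked example above can be reproduced with a short sketch:

# Reproduces the worked example: total risk is the probability-weighted
# sum of the losses for all possible accidents.

accidents = [
    ("A", 0.01, 1_000),          # (name, probability, loss)
    ("B", 0.000001, 2_000_000),
]

total_risk = sum(p * loss for _, p, loss in accidents)
print(total_risk)  # 10 from accident A plus 2 from accident B = 12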

One of the first major uses of this concept was for the planning of the Delta Works in 1953, a flood protection program in the Netherlands, with the aid of the mathematician David van Dantzig.[33] The kind of risk analysis pioneered there has become common today in fields like nuclear power, aerospace and the chemical industry.

In statistical decision theory, the risk function is defined as the expected value of a given loss function as a function of the decision rule used to make decisions in the face of uncertainty.
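
In the usual notation (a standard formulation rather than one taken from a source cited here), with parameter \theta, data X, decision rule \delta and loss function L, the risk function is:

 R(\theta, \delta) = \operatorname{E}_{\theta}\left[ L(\theta, \delta(X)) \right]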

Fear as intuitive risk assessment

For the time being, people rely on their fear and hesitation to keep them out of the most profoundly unknown circumstances.

In The Gift of Fear, Gavin de Becker argues that

True fear is a gift. It is a survival signal that sounds only in the presence of danger. Yet unwarranted fear has assumed a power over us that it holds over no other creature on Earth. It need not be this way.

Risk could be said to be the way we collectively measure and share this "true fear"—a fusion of rational doubt, irrational fear, and a set of unquantified biases from our own experience.

The field of behavioral finance focuses on human risk-aversion, asymmetric regret, and other ways that human financial behavior varies from what analysts call "rational". Risk in that case is the degree of uncertainty associated with a return on an asset.

Recognizing and respecting the irrational influences on human decision making may do much to reduce disasters caused by naive risk assessments that presume to rationality but in fact merely fuse many shared biases.

Risk in auditing

The audit risk model expresses the risk of an auditor providing an inappropriate opinion of a commercial entity's financial statements. It can be analytically expressed as:

AR = IR × CR × DR

Where AR is audit risk, IR is inherent risk, CR is control risk and DR is detection risk.
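
For instance, with purely hypothetical assessments of inherent risk 0.8, control risk 0.5 and detection risk 0.1, the model gives:

 \text{AR} = 0.8 \times 0.5 \times 0.1 = 0.04

That is, a 4% risk of an inappropriate audit opinion under those assumed inputs.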

In human services

Huge ethical and political issues arise when human beings themselves are seen or treated as 'risks', or when the risk decision making of people who use human services might have an impact on that service. The experience of many people who rely on human services for support is that 'risk' is often used as a reason to prevent them from gaining further independence or fully accessing the community, and that these services are often unnecessarily risk averse.[34]

Other Considerations

Another consideration in managing risk is that risks are potential future problems that can be treated, rather than current problems that must be immediately addressed.

Risk versus uncertainty

In his seminal work Risk, Uncertainty, and Profit, Frank Knight (1921) established the distinction between risk and uncertainty.

... Uncertainty must be taken in a sense radically distinct from the familiar notion of Risk, from which it has never been properly separated. The term "risk," as loosely used in everyday speech and in economic discussion, really covers two things which, functionally at least, in their causal relations to the phenomena of economic organization, are categorically different. ... The essential fact is that "risk" means in some cases a quantity susceptible of measurement, while at other times it is something distinctly not of this character; and there are far-reaching and crucial differences in the bearings of the phenomenon depending on which of the two is really present and operating. ... It will appear that a measurable uncertainty, or "risk" proper, as we shall use the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all. We ... accordingly restrict the term "uncertainty" to cases of the non-quantitive type.[35]

Thus, Knightian uncertainty is immeasurable, not possible to calculate, while in the Knightian sense risk is measurable.

Another distinction between risk and uncertainty is proposed in How to Measure Anything: Finding the Value of Intangibles in Business and The Failure of Risk Management: Why It's Broken and How to Fix It by Doug Hubbard:[36][37]

Uncertainty: The lack of complete certainty, that is, the existence of more than one possibility. The "true" outcome/state/result/value is not known.
Measurement of uncertainty: A set of probabilities assigned to a set of possibilities. Example: "There is a 60% chance this market will double in five years"
Risk: A state of uncertainty where some of the possibilities involve a loss, catastrophe, or other undesirable outcome.
Measurement of risk: A set of possibilities each with quantified probabilities and quantified losses. Example: "There is a 40% chance the proposed oil well will be dry with a loss of $12 million in exploratory drilling costs".

In this sense, Hubbard uses the terms so that one may have uncertainty without risk but not risk without uncertainty. We can be uncertain about the winner of a contest, but unless we have some personal stake in it, we have no risk. If we bet money on the outcome of the contest, then we have a risk. In both cases there is more than one possible outcome. The measure of uncertainty refers only to the probabilities assigned to outcomes, while the measure of risk requires both probabilities for outcomes and losses quantified for outcomes.

Risk Attitude

The terms attitude, appetite and tolerance are often used similarly to describe an organization's or individual's attitude towards risk taking. Risk averse, risk neutral and risk seeking are examples of the terms that may be used to describe a risk attitude.

Gambling is a risk-increasing investment, wherein money on hand is risked for a possible large return, but with the possibility of losing it all. Purchasing a lottery ticket is a very risky investment with a high chance of no return and a small chance of a very high return. In contrast, putting money in a bank at a defined rate of interest is a risk-averse action that gives a guaranteed return of a small gain and precludes other investments with possibly higher gain.

Risk as a vector quantity

Hubbard also argues that defining risk as the product of impact and probability presumes (probably incorrectly) that the decision makers are risk neutral.[37] Only for a risk neutral person is the "certain monetary equivalent" exactly equal to the probability of the loss times the amount of the loss. For example, a risk neutral person would consider a 20% chance of winning $1 million exactly equal to $200,000 (or a 20% chance of losing $1 million to be exactly equal to losing $200,000). However, most decision makers are not actually risk neutral and would not consider these equivalent choices. This gave rise to Prospect theory and Cumulative prospect theory. Hubbard proposes instead that risk is a kind of "vector quantity" that does not collapse the probability and magnitude of a risk by presuming anything about the risk tolerance of the decision maker. Risks are simply described as a set or function of possible loss amounts, each associated with specific probabilities. This array cannot be collapsed into a single value until the risk tolerance of the decision maker is quantified.
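
A minimal sketch of this view, assuming a simple exponential disutility function as a stand-in for the decision maker's (unspecified) risk tolerance:

import math

# Risk described as a set of (probability, loss) pairs, per the "vector
# quantity" view: no single number is produced until a risk tolerance is
# supplied. The exponential disutility below is a hypothetical choice.

risk_profile = [(0.20, 1_000_000), (0.80, 0)]   # 20% chance of a $1M loss

expected_loss = sum(p * loss for p, loss in risk_profile)   # risk-neutral value

def certainty_equivalent_loss(profile, risk_tolerance):
    """Collapse the profile for a risk-averse decision maker using an
    exponential disutility with the given tolerance parameter."""
    expected_disutility = sum(p * math.expm1(loss / risk_tolerance)
                              for p, loss in profile)
    return risk_tolerance * math.log1p(expected_disutility)

print(expected_loss)                                                # 200000.0
print(round(certainty_equivalent_loss(risk_profile, 2_000_000)))    # about 244000

The certainty-equivalent loss exceeds the expected loss, illustrating why a single probability-times-impact figure presumes risk neutrality.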

Risk can be both negative and positive, but it tends to be the negative side that people focus on, because some risky activities are dangerous, for example putting one's own or someone else's life at risk. Risks concern people because of their potential negative effect on the future.

Risk and size

In the book Megaprojects and Risk, Professor Bent Flyvbjerg (with Nils Bruzelius and Werner Rothengatter) demonstrates that big ventures (big construction projects, big capital investments, etc.) are highly risky. For instance, such ventures typically have high cost overruns, benefit shortfalls, and schedule delays, plus negative and unanticipated social and environmental impacts.[38]

Further reading

This is a list of books about risk issues.

Title Author(s) Year
Acceptable risk Baruch Fischhoff, Sarah Lichtenstein, Paul Slovic, Steven L. Derby, and Ralph Keeney 1984
American hazardscapes: The regionalization of hazards and disasters Susan L. Cutter 2001
At risk: Natural hazards, people's vulnerability and disasters Piers Blaikie, Terry Cannon, Ian Davis, and Ben Wisner 1994
Big dam foolishness; the problem of modern flood control and water storage Elmer Theodore Peterson 1954
Building Safer Communities. Risk Governance, Spatial Planning and Responses to Natural Hazards Urbano Fra Paleo 2009
Catastrophic coastal storms: Hazard mitigation and development management David R. Godschalk, David J. Brower, and Timothy Beatley 1989
Cities on the beach: management issues of developed coastal barriers Rutherford H. Platt, Sheila G. Pelczarski, and Barbara K. Burbank 1987
Cooperating with nature: Confronting natural hazards with land-use planning for sustainable communities Raymond J. Burby 1998
Dangerous earth: An introduction to geologic hazards Barbara W. Murck, Brian J. Skinner, Stephen C. Porter 1998
Disasters and democracy Rutherford H. Platt 1999
Disasters by design: A reassessment of natural hazards in the United States Dennis Mileti 1999
Disasters: The anatomy of environmental hazards John Whittow 1980
Divine wind: The history and science of hurricanes Kerry Emanuel 2005
Earth shock: Hurricanes, volcanoes, earthquakes, tornadoes and other forces of nature Andrew Robinson 1993
Earthquakes: A primer Bruce A. Bolt 1976
Environmental hazards: Assessing risk and reducing disaster Keith Smith 1992
Facing the unexpected: Disaster preparedness and response in the United States Kathleen J. Tierney, Michael K. Lindell, and Ronald W. Perry 2001
Floods Dennis J. Parker 2000
Human adjustment to floods Gilbert F. White 1942
Human System Response to Disaster: An Inventory of Sociological Findings Thomas E. Drabek 1986
Hurricanes : their nature and impacts on society. Roger A. Pielke, Jr. and Roger Pielke, Sr. 1997
Judgment under uncertainty: heuristics and biases Daniel Kahneman, Paul Slovic, and Amos Tversky 1982
Mapping vulnerability: disasters, development, and people Greg Bankoff, Georg Frerks, and Dorothea Hilhorst 2004
Man and Society in Calamity: The Effects of War, Revolution, Famine, Pestilence upon Human Mind, Behavior, Social Organization and Cultural Life Pitirim Sorokin 1942
Mitigation of hazardous comets and asteroids Michael J.S. Belton, Thomas H. Morgan, Nalin H. Samarasinha, Donald K. Yeomans 2005
Mountains of fire: The nature of volcanoes Robert W. Decker, Barbara B. Decker 1991
Natural disasters David Alexander 1993
Natural disasters Patrick L. Abbott 1991
Natural disasters: Protecting vulnerable communities Paul A. Merriman, and C.W. A. Browitt 1993
Natural disaster hotspots: a global risk analysis Maxx Dilley 2005
Natural hazard mitigation: Recasting disaster policy and planning David Godschalk, Timothy Beatley, Philip Berke, David Brower, and Edward J. Kaiser 1999
Natural hazards Edward Bryant 1991
Natural hazards: Earth’s processes as hazards, disasters, and catastrophes Edward A. Keller, and Robert H. Blodgett 2006
Natural hazards: Explanation and integration Graham A. Tobin, and Burrell E. Montz 1997
Natural hazards: Local, national, global Gilbert F. White 1974
Normal accidents. Living with high-risk technologies Charles Perrow 1984
On borrowed land: Public policies for floodplains Faber Scott 1993
Paying the price: The status and role of insurance against natural disasters in the United States Howard Kunreuther, and Richard J. Roth 1998
Planning for earthquakes: Risks, politics, and policy Philip R. Berke, and Timothy Beatley 1992
Promoting Risk: Constructing the Earthquake Threat Robert Stallings 1995
Reconstruction Following Disaster J. Eugene Haas, Robert Kates, and Martyn J. Bowden 1977
Recovery from Natural Disasters: Insurance or Federal Aid? Howard Kunreuther 1973
Reduction and predictability of natural disasters John B. Rundle, William Klein, Don L. Turcotte 1996
Regions of risk: A geographical introduction to disasters Kenneth Hewitt 1997
Risk and culture: An essay on the selection of technical and environmental dangers Mary Douglas, and Aaron Wildavsky 1982
Risk communication: A handbook for communicating environmental, safety, and health risks Regina E. Lundgren, and Andrea H. McMakin 1994
Risk society: Towards a new modernity Ulrich Beck 1992
Risk, environment and modernity: towards a new ecology Scott Lash, Bronislaw Szerszynski and Brian W. Sage 1996
Terra non firma: Understanding and preparing for earthquakes James M. Gere and Haresh M. Shah 1984
The angry earth: Disaster in anthropological perspective Anthony Oliver-Smith, and Susanna Hoffman 1999
The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA Diane Vaughan 1997
The Control of Nature John McPhee 1989
The hurricane and its impact Robert H. Simpson, and Herbert Riehl 1981
The environment as hazard Ian Burton, Robert Kates, and Gilbert F. White 1978
The perception of risk Paul Slovic 2000
The social amplification of risk Nick Pidgeon, Roger E. Kasperson, and Paul Slovic 2003
There is no such thing as a natural disaster : race, class, and Hurricane Katrina Chester W. Hartman, and Gregory D. Squires 2006
Understanding catastrophe: Its impact on life on earth Janine Bourrian 1992
What is a disaster? New answers to old questions Ronald W. Perry, and Enrico Quarantelli 2005
What is a disaster? Perspectives on the question Enrico Quarantelli 1998

References

  1. ^ Oxford English Dictionary
  2. ^ a b c d e Luhmann 1996:3.
  3. ^ James Franklin, 2001: The Science of Conjecture: Evidence and Probability Before Pascal, Baltimore: Johns Hopkins University Press, 274.
  4. ^ Luhmann 1996:4.
  5. ^ Douglas Hubbard The Failure of Risk Management: Why It's Broken and How to Fix It, John Wiley & Sons, 2009.
  6. ^ E.g. "Risk is the unwanted subset of a set of uncertain outcomes." (Cornelius Keating).
  7. ^ a b "An Introduction to Factor Analysis of Information Risk (FAIR)", Risk Management Insight LLC, November 2006;.
  8. ^ Technical Standard Risk Taxonomy ISBN 1-931624-77-1 Document Number: C081 Published by The Open Group, January 2009.
  9. ^ "Risk is a combination of the likelihood of an occurrence of a hazardous event or exposure(s) and the severity of injury or ill health that can be caused by the event or exposure(s)" (OHSAS 18001:2007).
  10. ^ ISO/IEC 27005:2008.
  11. ^ [1].
  12. ^ Flyvbjerg, B., 2008, "Curbing Optimism Bias and Strategic Misrepresentation in Planning: Reference Class Forecasting in Practice." European Planning Studies, vol. 16, no. 1, January, pp. 3-21.
  13. ^ Rychetnik L, Hawe P, Waters E, Barratt A, Frommer M (2004 July). "A glossary for evidence based public health". J Epidemiol Community Health 58: 538–45. doi:10.1136/jech.2003.011585. PMC 1732833. PMID 15194712. http://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=1732833. 
  14. ^ Cortada, James W. (2003-12-04). The Digital Hand: How Computers Changed the Work of American Manufacturing, Transportation, and Retail Industries. USA: Oxford University Press. pp. 512. ISBN 0195165888.
  15. ^ Cortada, James W. (2005-11-03). The Digital Hand: Volume II: How Computers Changed the Work of American Financial, Telecommunications, Media, and Entertainment Industries. USA: Oxford University Press. ISBN 978-0195165876.
  16. ^ Cortada, James W. (2007-11-06). The Digital Hand, Vol 3: How Computers Changed the Work of American Public Sector Industries. USA: Oxford University Press. pp. 496. ISBN 978-0195165869.
  17. ^ 44 U.S.C. § 3542(b)(1).
  18. ^ Damodaran, Aswath (2003). Investment Philosophies: Successful Investment Philosophies and the Greatest Investors Who Made Them Work. Wiley. p. 15. ISBN 0-471-34503-2. 
  19. ^ Sapienza P., Zingales L. and Maestripieri D. 2009. Gender differences in financial risk aversion and career choices are affected by testosterone. Proceedings of the National Academy of Sciences.
  20. ^ Apicella C. L. et al. Testosterone and financial risk preferences. Evolution and Human Behavior, vol. 29, issue 6, pp. 384–390.
  21. ^ Value at risk.
  22. ^ Julian Talbot and Miles Jakeman Security Risk Management Body of Knowledge, John Wiley & Sons, 2009.
  23. ^ http://flyvbjerg.plan.aau.dk/JAPAASPUBLISHED.pdf.
  24. ^ http://flyvbjerg.plan.aau.dk/Traffic91PRINTJAPA.pdf.
  25. ^ http://flyvbjerg.plan.aau.dk/0406DfT-UK%20OptBiasASPUBL.pdf.
  26. ^ Amos Tversky / Daniel Kahneman, 1981. "The Framing of Decisions and the Psychology of Choice."
  27. ^ Schatz, J., Craft, S., Koby, M., & DeBaun, M. R. (2004). Asymmetries in visual-spatial processing following childhood stroke. Neuropsychology, 18, 340-352.
  28. ^ Volberg, G., & Hubner, R. (2004). On the role of response conflicts and stimulus position for hemispheric differences in global/local processing: An ERP study. Neuropsychologia, 42, 1805-1813.
  29. ^ Drake, R. A. (2004). Selective potentiation of proximal processes: Neurobiological mechanisms for spread of activation. Medical Science Monitor, 10, 231-234.
  30. ^ McElroy, T., & Seta, J. J. (2004). On the other hand am I rational? Hemisphere activation and the framing effect. Brain and Cognition, 55, 572-580.
  31. ^ Flyvbjerg 2006.
  32. ^ Landsburg, Steven (2003-03-03). "Is your life worth $10 million?". Everyday Economics (Slate). http://www.slate.com/id/2079475/. Retrieved 2008-03-17. 
  33. ^ Wired Magazine, Before the levees break, page 3.
  34. ^ A Positive Approach To Risk Requires Person Centred Thinking, Neill et al, Tizard Learning Disability Review http://pierprofessional.metapress.com/content/vr700311x66j0125/
  35. ^ Frank Hyneman Knight "Risk, uncertainty and profit" pg. 19, Hart, Schaffner, and Marx Prize Essays, no. 31. Boston and New York: Houghton Mifflin. 1921.
  36. ^ Douglas Hubbard "How to Measure Anything: Finding the Value of Intangibles in Business" pg. 46, John Wiley & Sons, 2007.
  37. ^ a b Douglas Hubbard, The Failure of Risk Management: Why It's Broken and How to Fix It, John Wiley & Sons, 2009.
  38. ^ Bent Flyvbjerg, Nils Bruzelius, and Werner Rothengatter, 2003, Megaprojects and Risk: An Anatomy of Ambition (Cambridge University Press).

External links

The Wiktionary entry for risk